Apple defends scanning iPhones for child abuse images, saying algorithm only identifies flagged pics

Daily Mail - Science & tech

Apple is pushing back against criticism over its plan to scan photos on users' iPhones and in iCloud storage in search of child sexual abuse images. In a Frequently Asked Questions document focusing on its 'Expanded Protections for Children,' Apple insisted its system couldn't be exploited to seek out images related to anything other than child sexual abuse material (CSAM). The system will not scan photo albums, Apple says, but rather looks for matches against a database of 'hashes' - a type of digital fingerprint - of known CSAM images provided by child safety organizations. While privacy advocates worry about 'false positives,' Apple boasted that 'the likelihood that the system would incorrectly flag any given account is less than one in one trillion per year.' Apple also claims it would 'refuse any such demands' from government agencies, in the US or abroad.
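The hash-matching approach described above can be sketched in a few lines. This is a minimal illustration only: it uses a SHA-256 digest as the 'fingerprint,' whereas Apple's actual system uses NeuralHash, a perceptual hash designed to survive resizing and re-encoding; the database contents and function names here are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Compute a digital 'fingerprint' of an image's raw bytes.

    Illustrative stand-in: a cryptographic hash changes entirely if a
    single byte changes, unlike the perceptual hashes used in practice.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of fingerprints of known flagged images,
# standing in for the hash lists supplied by child safety organizations.
known_db = {fingerprint(b"known-flagged-image-bytes")}

def is_flagged(image_bytes: bytes) -> bool:
    """Match an image against the database; the photo itself is never
    inspected, only its fingerprint is compared."""
    return fingerprint(image_bytes) in known_db

print(is_flagged(b"known-flagged-image-bytes"))  # True: fingerprint matches
print(is_flagged(b"holiday-photo-bytes"))        # False: no database match
```

The key property this sketch demonstrates is the one Apple emphasizes: an unmatched photo produces no information at all, since only membership of its fingerprint in the database is tested.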


AI tool detects child abuse images with 99% accuracy

#artificialintelligence

A new AI-powered tool claims to detect child abuse images with around 99 percent accuracy. The tool, called Safer, is developed by non-profit Thorn to assist businesses which do not have in-house filtering systems to detect and remove such images. According to the Internet Watch Foundation in the UK, reports of child abuse images surged 50 percent during the COVID-19 lockdown. In the 11 weeks starting on 23rd March, its hotline logged 44,809 reports of images compared with 29,698 last year. Many of these images are from children who've spent more time online and been coerced into releasing images of themselves.
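The IWF figures quoted above can be checked directly; the variable names below are illustrative, and the numbers come straight from the article.

```python
# Verify the reported ~50 percent surge from the cited IWF hotline figures.
reports_lockdown = 44_809  # 11 weeks from 23 March during lockdown
reports_prior = 29_698     # same period the previous year

increase = (reports_lockdown - reports_prior) / reports_prior
print(f"{increase:.1%}")  # ≈ 50.9%, consistent with the reported 50 percent
```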


Search engine Bing is showing child PORNOGRAPHY

Daily Mail - Science & tech

Microsoft's Bing search engine shows results for sickening child pornography images, research has found. The investigation revealed that it was easy to find illegal photos of under-age boys and girls on the site. Image searches for 'porn kids,' 'porn CP' (a known abbreviation for 'child pornography') and 'nude family kids' all produced the exploitative content. People looking for the horrific content only needed to turn off the SafeSearch filter to find the imagery. An investigation commissioned by TechCrunch found that Bing also suggested other disturbing phrases to help paedophiles target children.